10 research outputs found

    Artificial intelligence and UK national security: Policy considerations

    RUSI was commissioned by GCHQ to conduct an independent research study into the use of artificial intelligence (AI) for national security purposes. The aim of the project was to establish an independent evidence base to inform future policy development regarding national security uses of AI. The findings are based on in-depth consultation with stakeholders from across the UK national security community, law enforcement agencies, private sector companies, academic and legal experts, and civil society representatives, complemented by a targeted review of the existing literature on AI and national security. The research found that AI offers numerous opportunities for the UK national security community to improve the efficiency and effectiveness of existing processes. AI methods can rapidly derive insights from large, disparate datasets and identify connections that would otherwise go unnoticed by human operators. However, in the context of national security and the powers given to UK intelligence agencies, the use of AI could give rise to additional privacy and human rights considerations, which would need to be assessed within the existing legal and regulatory framework. For this reason, enhanced policy and guidance are needed to ensure that the privacy and human rights implications of national security uses of AI are reviewed on an ongoing basis as new analysis methods are applied to data.

    The Rapid Rise of Generative AI: Assessing risks to safety and security

    This CETaS Research Report presents the findings from a major project exploring the implications of generative AI for national security. It is based on extensive engagement with more than 50 experts across government, academia, industry, and civil society, and represents the most comprehensive UK-based study to date on the national security implications of generative AI. The research found that generative AI could significantly amplify a range of digital, physical and political security risks. With the rapid proliferation of generative AI tools across the economy, the national security community needs to shift its mindset to account for all the unintentional or incidental ways in which generative AI can pose security risks, in addition to intentionally malicious uses. The report provides recommendations to effectively mitigate the security risks posed by generative AI, calling for a new multi-layered, socio-technical approach to system evaluation.

    Observing Data-Driven Approaches to Covid-19: Reflections from a Distributed, Remote, Interdisciplinary Research Project

    The Observatory for Monitoring Data-Driven Approaches to Covid-19 (OMDDAC) is an Arts and Humanities Research Council funded research project investigating data-driven approaches to Covid-19, focused upon legal, ethical, policy and operational challenges. The project is a collaboration between Northumbria University (Law School, Department of Computing and Information Sciences, Department of Mathematics) and the Royal United Services Institute, a defence and security think-tank, and aims to carry out integrated interdisciplinary research, regarded as the most challenging type of interdisciplinarity but also the type whose outputs can be the most impactful. Due to the constraints of the pandemic, the project has been carried out in a fully distributed and remote manner, with some team members never having met in person. The subject of the research is continually changing and developing, creating unique project management issues, and the impact of the pandemic has been pervasive in the lives of the researchers. This article takes the form of a series of reflections from the points of view of individual project researchers (the specialist legal researcher, the think-tank Co-Investigator, the post-doctoral researcher, the statistical and data science researchers, and the Principal Investigator), organised under two main themes: project management and internal communication; and methodologies and interdisciplinary research. We thus draw out lessons for future remote and distributed research, focused upon interdisciplinarity, the benefits and challenges of remote research methodologies, and issues of collegiality. Finally, we warn that it will be a false economy for universities and funders to assume that research projects can continue to be conducted in a mainly remote manner and that, therefore, budgetary savings can be made by reducing time allocations, travel and academic networking.

    OMDDAC Snapshot Report 2: Tech-driven approaches to Public Health

    This Snapshot Report incorporates OMDDAC’s findings from interviews with key stakeholders, together with published research, to capture the experiences and lessons learned throughout the pandemic in relation to technology-driven approaches to public health. The report examines three case studies: digital proximity and exposure notification; risk scoring algorithms; and Covid-status certification.

    OMDDAC Snapshot Report 1: Data-driven Public Policy

    ‘Data-driven’ decision-making has been at the heart of the response to Covid-19 in the UK. Data-driven approaches include: sharing, linkage and analysis of different datasets from various sources; predictive modelling to anticipate and understand transmission and inform policy; and data-driven profiling to identify and support vulnerable individuals. This Snapshot Report incorporates OMDDAC’s findings from interviews with stakeholders, together with published research, to capture the lessons learned throughout the pandemic across these three case studies.

    OMDDAC Snapshot Report 3: Policing and Public Safety

    Policing during a pandemic brings novel data-driven challenges. Solving them requires significant coordination and clear communication, both within forces and across public sector agencies. This report presents three case studies demonstrating the range of opportunities and difficulties facing the police in this period: police access to NHS Test and Trace data; monitoring of crime and enforcement trends; and monitoring of police resourcing and wellbeing.

    Data-Driven Responses to COVID-19: Lessons Learned: OMDDAC Research Compendium

    Funded by the Arts and Humanities Research Council under the UKRI COVID-19 Rapid Response call, the Observatory for Monitoring Data-Driven Approaches to COVID-19 (OMDDAC) is a collaboration between Northumbria University and the Royal United Services Institute (RUSI). The project has brought together a multidisciplinary team of researchers (with expertise in the law on technology, data protection and medicine, as well as practical ethics, computer science, data science, applied statistics in health, technology and security studies, and behavioural science) to investigate the legal, ethical, policy and operational challenges encountered in relation to key data-driven responses to the pandemic. The COVID-19 pandemic has accelerated the consideration of several priorities in the data and technology space, which are reflected in the UK Government’s present strategies. The National Data Strategy, in particular, pledges to take account of the lessons learned from the COVID-19 response and to draw upon the UK’s values of transparency, accountability and inclusion. Seeking to inform the lessons learned from the pandemic, the project used a mixed-methods research design that included case study analysis, interviews with key stakeholders (individuals with relevant expertise and/or experience in relation to the data-driven pandemic response), representative public surveys, and engagement with young people through a children’s rights charity. OMDDAC has published four snapshot reports focused on data-driven public policy, tech-driven approaches to public health, policing and public safety, and key findings from the public perceptions survey. The emerging issues identified in those reports align closely with the four pillars of the National Data Strategy, which form the framework for this final project report: (1) Data Foundations (data quality issues and infrastructure); (2) Data Skills (data literacy of decision-makers); (3) Data Availability (data sharing); and (4) Responsibility (law, ethics, transparency, and public trust).

    OMDDAC Practitioner Guidelines

    These practitioner guidelines are presented by the AHRC-funded ‘Observatory for Monitoring Data-Driven Approaches to COVID-19’ (OMDDAC) project. OMDDAC is a collaboration between Northumbria University and the Royal United Services Institute (RUSI), researching data-driven approaches to COVID-19 with a focus upon legal, ethical, policy and operational challenges. OMDDAC has analysed key data-driven responses to COVID-19, collating lessons learned in ‘real time’ throughout the pandemic by way of representative public surveys, case study analysis and interviews with key stakeholders from a range of sectors (including local and central government, regulators, law enforcement, the medical and legal professions, charities and the third sector, the private sector, and an interdisciplinary range of academics). These practitioner guidelines have been informed by our research findings and are relevant specifically to practitioners who work with data in the health and social care sector and in the law enforcement sector.

    Privacy Intrusion and National Security in the Age of AI: Assessing Proportionality of Automated Analytics

    This CETaS Research Report explores the complex issue of privacy intrusion arising from the use of automated analytics, with specific focus on artificial intelligence (AI). The research focuses on UK national security and law enforcement agencies with access to legal powers that incur some degree of intrusion into individuals’ private lives. As automated methods are increasingly deployed to process the data collected through the use of such powers, there is a need to understand the additional privacy considerations that could arise as a result of this automated processing. The report’s ultimate objective is to develop a structured analytical framework for assessing the proportionality of privacy intrusion arising from the use of automated analytics. The framework considers the whole lifecycle of automated analytics, including data collection, training, testing processes, and use. It aims to introduce a common language and taxonomy that will assist stakeholders in identifying, comparing, and assessing the potential impact of relevant privacy considerations in a structured and evidence-based way. The framework is not intended to replace any existing authorisation or compliance processes, but rather to provide an additional layer of rigour and assurance to supplement and future-proof existing processes. The research is informed by semi-structured interviews and focus groups with stakeholders across the UK government, national security and law enforcement, and legal experts outside government, as well as a review of the literature on proportionality in English law and critiques of how the proportionality test is applied. Particular attention is paid to the distinctive aspects of automated processes and artificial intelligence, and to the requirements for making a structured analytical framework useful in practice.

    Implications of Emerging Privacy Enhancing Technologies for UK Surveillance Policy

    This report aims to establish an independent evidence base to inform future government policy development and strategic thinking regarding national security uses of Privacy Enhancing Technologies (PETs). The findings are based on in-depth consultation with stakeholders across the UK Intelligence Community (UKIC), the Investigatory Powers Commissioner’s Office (IPCO) and academic experts. The research has identified new opportunities for the UKIC and partner organisations to apply PETs in three main areas: data acquisition and information requests; secure machine learning; and non-operational data sharing. These opportunities hold high potential on the condition that the UKIC can establish clarity regarding the motivation for using a PET in each circumstance and the safeguards to be applied. The research concluded that PETs may provide a less intrusive means of carrying out intelligence work. However, in each case the motivation for using the PET must be clear in order to inform the legal considerations governing its use. PETs could also encourage the sharing of knowledge and capabilities between the UKIC and external partners, although the successful use of PETs in this context will be closely linked to the levels of cooperation from third-party data holders. There is a clear risk in deploying PETs without having developed appropriate levels of trust. Future policy will need to account for these concerns, to ensure that organisations retain appropriate knowledge and control over the analytical process deployed by the PET, thereby ensuring trust and accountability throughout the full analysis pipeline.